In probability theory, two events ''R'' and ''B'' are '''conditionally independent''' given a third event ''Y'' precisely if the occurrence or non-occurrence of ''R'' and the occurrence or non-occurrence of ''B'' are independent events in their conditional probability distribution given ''Y''. In other words, ''R'' and ''B'' are conditionally independent given ''Y'' if and only if, given knowledge that ''Y'' occurs, knowledge of whether ''R'' occurs provides no information on the likelihood of ''B'' occurring, and knowledge of whether ''B'' occurs provides no information on the likelihood of ''R'' occurring.

== Formal definition ==
In the standard notation of probability theory, ''R'' and ''B'' are conditionally independent given ''Y'' if and only if
:<math>\Pr(R \cap B \mid Y) = \Pr(R \mid Y)\,\Pr(B \mid Y),</math>
or equivalently,
:<math>\Pr(R \mid B \cap Y) = \Pr(R \mid Y).</math>

Two random variables ''X'' and ''Y'' are conditionally independent given a third random variable ''Z'' if and only if they are independent in their conditional probability distribution given ''Z''. That is, ''X'' and ''Y'' are conditionally independent given ''Z'' if and only if, given any value of ''Z'', the probability distribution of ''X'' is the same for all values of ''Y'' and the probability distribution of ''Y'' is the same for all values of ''X''.

Two events ''R'' and ''B'' are conditionally independent given a σ-algebra Σ if
:<math>\Pr(R \cap B \mid \Sigma) = \Pr(R \mid \Sigma)\,\Pr(B \mid \Sigma) \quad \text{a.s.},</math>
where <math>\Pr(A \mid \Sigma)</math> denotes the conditional expectation of the indicator function of the event ''A'', <math>\chi_A</math>, given the σ-algebra Σ. That is,
:<math>\Pr(A \mid \Sigma) := \operatorname{E}[\chi_A \mid \Sigma].</math>

Two random variables ''X'' and ''Y'' are conditionally independent given a σ-algebra Σ if the above equation holds for all ''R'' in σ(''X'') and ''B'' in σ(''Y'').

Two random variables ''X'' and ''Y'' are conditionally independent given a random variable ''W'' if they are independent given σ(''W''): the σ-algebra generated by ''W''. This is commonly written
:<math>X \perp\!\!\!\perp Y \mid W</math>
or
:<math>X \perp Y \mid W.</math>

This is read "''X'' is independent of ''Y'', given ''W''"; the conditioning applies to the whole statement: "(''X'' is independent of ''Y'') given ''W''".
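The factorization definition above can be checked numerically. The following sketch (the distribution and variable names are illustrative, not from the source) builds a toy joint distribution in which, given ''Y'', the events ''R'' and ''B'' are drawn independently, and then verifies that Pr(''R'' ∩ ''B'' | ''Y'') = Pr(''R'' | ''Y'') Pr(''B'' | ''Y''):

```python
from itertools import product

# Hypothetical toy distribution: given Y = y, draw R and B independently
# with P(R = 1 | Y = y) = p_r[y] and P(B = 1 | Y = y) = p_b[y].
p_y = {0: 0.4, 1: 0.6}
p_r = {0: 0.2, 1: 0.7}   # P(R = 1 | Y = y)
p_b = {0: 0.5, 1: 0.9}   # P(B = 1 | Y = y)

# Build the full joint distribution P(R = r, B = b, Y = y).
joint = {}
for r, b, y in product([0, 1], repeat=3):
    pr = p_r[y] if r else 1 - p_r[y]
    pb = p_b[y] if b else 1 - p_b[y]
    joint[(r, b, y)] = p_y[y] * pr * pb

def cond(event, given):
    """P(event | given); both arguments are predicates on (r, b, y)."""
    num = sum(p for k, p in joint.items() if event(*k) and given(*k))
    den = sum(p for k, p in joint.items() if given(*k))
    return num / den

# Verify the factorization Pr(R ∩ B | Y) = Pr(R | Y) Pr(B | Y) for each y.
for y in (0, 1):
    lhs = cond(lambda r, b, _y: r == 1 and b == 1, lambda r, b, _y: _y == y)
    rhs = (cond(lambda r, b, _y: r == 1, lambda r, b, _y: _y == y)
           * cond(lambda r, b, _y: b == 1, lambda r, b, _y: _y == y))
    assert abs(lhs - rhs) < 1e-12
```

The factorization holds here by construction; replacing `joint` with an arbitrary distribution would in general make the assertion fail, since conditional independence is a genuine restriction on the joint law.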
If ''W'' assumes a countable set of values, this is equivalent to the conditional independence of ''X'' and ''Y'' for the events of the form [''W'' = ''w'']. Conditional independence of more than two events, or of more than two random variables, is defined analogously.

The following two examples show that ''X'' ⊥ ''Y'' ''neither implies nor is implied by'' ''X'' ⊥ ''Y'' | ''W''.

First, suppose ''W'' is 0 with probability 0.5 and 1 otherwise. When ''W'' = 0, take ''X'' and ''Y'' to be independent, each having the value 0 with probability 0.99 and the value 1 otherwise. When ''W'' = 1, ''X'' and ''Y'' are again independent, but this time they take the value 1 with probability 0.99. Then ''X'' ⊥ ''Y'' | ''W''. But ''X'' and ''Y'' are dependent, because Pr(''X'' = 0) < Pr(''X'' = 0 | ''Y'' = 0). This is because Pr(''X'' = 0) = 0.5, but if ''Y'' = 0 then it is very likely that ''W'' = 0 and thus that ''X'' = 0 as well, so Pr(''X'' = 0 | ''Y'' = 0) > 0.5.

For the second example, suppose ''X'' ⊥ ''Y'', each taking the values 0 and 1 with probability 0.5. Let ''W'' be the product ''X'' × ''Y''. Then when ''W'' = 0, Pr(''X'' = 0 | ''W'' = 0) = 2/3, but Pr(''X'' = 0 | ''W'' = 0, ''Y'' = 0) = 1/2, so ''X'' ⊥ ''Y'' | ''W'' is false. This is also an example of explaining away. See Kevin Murphy's tutorial 〔http://people.cs.ubc.ca/~murphyk/Bayes/bnintro.html〕 where ''X'' and ''Y'' take the values "brainy" and "sporty".
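Both counterexamples involve only finitely many outcomes, so the claimed probabilities can be verified by exact enumeration. The sketch below (helper names are illustrative) uses exact rational arithmetic to confirm the figures quoted in the text:

```python
from fractions import Fraction as F

def prob(joint, pred):
    """Total probability of all (x, y, w) outcomes satisfying pred."""
    return sum(p for k, p in joint.items() if pred(*k))

# Example 1: W ~ Bernoulli(1/2); given W = w, X and Y are independent,
# each equal to w with probability 0.99.
joint1 = {}
for w in (0, 1):
    for x in (0, 1):
        for y in (0, 1):
            px = F(99, 100) if x == w else F(1, 100)
            py = F(99, 100) if y == w else F(1, 100)
            joint1[(x, y, w)] = F(1, 2) * px * py

p_x0 = prob(joint1, lambda x, y, w: x == 0)
p_x0_given_y0 = (prob(joint1, lambda x, y, w: x == 0 and y == 0)
                 / prob(joint1, lambda x, y, w: y == 0))
assert p_x0 == F(1, 2)
assert p_x0_given_y0 > p_x0   # X and Y are (unconditionally) dependent

# Example 2: X, Y independent fair coins, W = X * Y ("explaining away").
joint2 = {(x, y, x * y): F(1, 4) for x in (0, 1) for y in (0, 1)}

p_x0_w0 = (prob(joint2, lambda x, y, w: x == 0 and w == 0)
           / prob(joint2, lambda x, y, w: w == 0))
p_x0_w0_y0 = (prob(joint2, lambda x, y, w: x == 0 and w == 0 and y == 0)
              / prob(joint2, lambda x, y, w: w == 0 and y == 0))
assert p_x0_w0 == F(2, 3)     # Pr(X = 0 | W = 0) = 2/3
assert p_x0_w0_y0 == F(1, 2)  # Pr(X = 0 | W = 0, Y = 0) = 1/2, so CI fails
```

Using `Fraction` rather than floats makes the equalities exact, so the check that Pr(''X'' = 0 | ''W'' = 0) ≠ Pr(''X'' = 0 | ''W'' = 0, ''Y'' = 0) does not depend on floating-point tolerance.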